# Efficient inference with distilled models
## NLLB-200 Distilled 600M
A distilled version of NLLB-200 with 600M parameters, supporting machine translation across 200 languages.
- Task: machine translation
- Library: Transformers (multilingual)

- Author: Narsil
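A minimal sketch of running this model through the Transformers `pipeline` API. The checkpoint id `facebook/nllb-200-distilled-600M` and the FLORES-200 language codes (`eng_Latn`, `fra_Latn`) are assumptions based on the standard Hugging Face release of the distilled NLLB-200 600M model, not details stated on this page.

```python
# Sketch: English-to-French translation with the distilled NLLB-200 600M
# checkpoint. Assumes the "facebook/nllb-200-distilled-600M" model id and
# FLORES-200 language codes used by the public Hugging Face release.

def top_translation(outputs: list) -> str:
    """Pull the translated string out of a translation-pipeline result list."""
    return outputs[0]["translation_text"]

if __name__ == "__main__":
    # Heavy import kept inside the guard: loading the pipeline downloads
    # the ~600M-parameter checkpoint on first use.
    from transformers import pipeline

    translator = pipeline(
        "translation",
        model="facebook/nllb-200-distilled-600M",
        src_lang="eng_Latn",  # source: English, Latin script
        tgt_lang="fra_Latn",  # target: French, Latin script
    )
    result = translator("Distilled models trade a little accuracy for speed.")
    print(top_translation(result))
```

NLLB-200 uses explicit source and target language codes rather than inferring them, so `src_lang`/`tgt_lang` must be set for each language pair.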
## Emotion English DistilRoBERTa Base
A fine-tuned English text-emotion classification model based on DistilRoBERTa-base, predicting Ekman's six basic emotions plus a neutral class.
- Task: text classification
- Library: Transformers (English)

- Author: j-hartmann
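A minimal sketch of scoring all seven classes (Ekman's six emotions plus neutral) with this model via the Transformers text-classification pipeline. The model id `j-hartmann/emotion-english-distilroberta-base` follows the author and model name in the listing above; passing `top_k=None` to request every class score is standard pipeline behavior, assumed here rather than stated on this page.

```python
# Sketch: emotion classification with the DistilRoBERTa-base emotion model.
# Assumes the "j-hartmann/emotion-english-distilroberta-base" checkpoint id.

def best_label(scores: list) -> str:
    """Return the highest-scoring label from a list of {label, score} dicts."""
    return max(scores, key=lambda s: s["score"])["label"]

if __name__ == "__main__":
    # Heavy import kept inside the guard: the pipeline downloads the
    # checkpoint on first use.
    from transformers import pipeline

    classifier = pipeline(
        "text-classification",
        model="j-hartmann/emotion-english-distilroberta-base",
        top_k=None,  # return scores for all seven classes, not just the argmax
    )
    scores = classifier("I can't believe how fast the distilled model is!")[0]
    print(best_label(scores))
```

Because the distilled backbone is small, this model is practical for batch-scoring large English corpora on CPU.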